Gibbs sampling for parsimonious Markov models with latent variables

Authors

  • Ralf Eggeling
  • Pierre-Yves Bourguignon
  • André Gohr
  • Ivo Grosse
Abstract

Parsimonious Markov models have recently been developed as a generalization of variable order Markov models. Many practical applications involve a setting with latent variables, a common example being mixture models. Here, we propose a Bayesian model averaging approach for learning mixtures of parsimonious Markov models that is based on Gibbs sampling. The challenge lies in sampling one model structure out of a large number of candidates, which we solve with an efficient dynamic programming algorithm. We apply the resulting Gibbs sampling algorithm to splice site classification, an important problem from computational biology, and find the Bayesian approach to be superior to non-Bayesian classification.
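The paper's sampler alternates between latent labels and model parameters. As a rough illustration only, the sketch below runs Gibbs sampling for a mixture of plain first-order Markov chains with conjugate Dirichlet priors; all names are illustrative, and the paper's key ingredient, the dynamic-programming step that also samples a parsimonious context-tree structure per component, is deliberately omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def gibbs_mixture_markov(seqs, K, A, n_iter=100, alpha=1.0):
    """Gibbs sampler for a mixture of first-order Markov chains.

    seqs: list of integer sequences over the alphabet {0, ..., A-1}.
    Simplified sketch: fixed-order chains stand in for the paper's
    parsimonious Markov models, whose structures would also be sampled.
    """
    N = len(seqs)
    z = rng.integers(K, size=N)              # latent component labels
    w = np.full(K, 1.0 / K)
    T = np.full((K, A, A), 1.0 / A)
    for _ in range(n_iter):
        # (1) parameters | labels: conjugate Dirichlet updates
        w = rng.dirichlet(alpha + np.bincount(z, minlength=K))
        for k in range(K):
            counts = np.full((A, A), alpha)  # pseudocounts from the prior
            for i in np.flatnonzero(z == k):
                s = seqs[i]
                for a, b in zip(s[:-1], s[1:]):
                    counts[a, b] += 1
            for a in range(A):
                T[k, a] = rng.dirichlet(counts[a])
        # (2) labels | parameters: categorical full conditional per sequence
        for i, s in enumerate(seqs):
            logp = np.log(w)
            for k in range(K):
                for a, b in zip(s[:-1], s[1:]):
                    logp[k] += np.log(T[k, a, b])
            p = np.exp(logp - logp.max())    # stabilized softmax
            z[i] = rng.choice(K, p=p / p.sum())
    return z, w, T
```

After burn-in, the retained samples of (z, w, T) approximate the posterior, which is what enables the Bayesian model averaging the abstract describes.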


Related articles

Gibbs Sampling for Bayesian Non-Conjugate and Hierarchical Models by Using Auxiliary Variables

We demonstrate the use of auxiliary (or latent) variables for sampling non-standard densities which arise in the context of the Bayesian analysis of non-conjugate and hierarchical models by using a Gibbs sampler. Their strategic use can result in a Gibbs sampler having easily sampled full conditionals. We propose such a procedure to simplify or speed up the Markov chain Monte Carlo algorithm. T...
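A classic instance of this auxiliary-variable idea is the slice sampler: introducing a uniform height variable makes both full conditionals easy to draw from. The sketch below is a generic one-dimensional version (stepping-out and shrinkage, in the style of Neal 2003), not the specific construction of the cited paper; all names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def slice_sample(logf, x0, n, w=1.0):
    """Auxiliary-variable (slice) sampler for a 1-D unnormalized density f.

    Adds u ~ Uniform(0, f(x)); given u, x is uniform on the slice
    {x : f(x) > u}, so both Gibbs full conditionals are trivial.
    """
    xs, x = [], x0
    for _ in range(n):
        logu = logf(x) + np.log(1.0 - rng.uniform())  # auxiliary height
        lo = x - w * rng.uniform()                    # random initial bracket
        hi = lo + w
        while logf(lo) > logu:                        # step out left
            lo -= w
        while logf(hi) > logu:                        # step out right
            hi += w
        while True:                                   # shrink until accepted
            xp = rng.uniform(lo, hi)
            if logf(xp) > logu:
                x = xp
                break
            if xp < x:
                lo = xp
            else:
                hi = xp
        xs.append(x)
    return np.array(xs)
```

Because only log-density evaluations are needed, the same sampler applies to the non-conjugate full conditionals that motivate the cited work.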


Gibbs Sampling for the Probit Regression Model with Gaussian Markov Random Field Latent Variables

We consider a binary probit model where the latent variables follow a Gaussian Markov random field (GMRF). Our main objective is to derive an efficient Gibbs sampler for the above model. For this purpose, we first review two Gibbs samplers available for the classical probit model with one latent variable. We find that the joint update of variables increases the rate of convergence. We use these ...
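The classical-probit samplers the snippet reviews are typically built on Albert–Chib data augmentation. A minimal sketch under assumed names and a N(0, tau2*I) prior on the coefficients is given below; the cited paper's GMRF variant would instead place a structured Gaussian prior on the latent field, and its blocked (z, beta) update is what improves convergence.

```python
import numpy as np

rng = np.random.default_rng(2)

def probit_gibbs(X, y, n_iter=150, tau2=100.0):
    """Albert-Chib Gibbs sampler for classical Bayesian probit regression.

    Augmentation: z_i ~ N(x_i'beta, 1) with y_i = 1{z_i > 0}.
    Full conditionals: each z_i is truncated normal; beta is
    multivariate normal under a N(0, tau2*I) prior (illustrative choice).
    """
    n, p = X.shape
    beta = np.zeros(p)
    V = np.linalg.inv(X.T @ X + np.eye(p) / tau2)  # posterior covariance
    L = np.linalg.cholesky(V)
    draws = np.empty((n_iter, p))
    z = np.empty(n)
    for t in range(n_iter):
        mu = X @ beta
        # z_i | beta, y_i: truncated normal, drawn here by naive
        # rejection for brevity (a real implementation would use an
        # efficient truncated-normal sampler)
        for i in range(n):
            while True:
                cand = rng.normal(mu[i], 1.0)
                if (cand > 0) == bool(y[i]):
                    z[i] = cand
                    break
        m = V @ (X.T @ z)                          # beta | z: Gaussian
        beta = m + L @ rng.normal(size=p)
        draws[t] = beta
    return draws
```

This single-site scheme updates z and beta in turn; the joint update mentioned in the snippet draws them as one block to reduce autocorrelation.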


Parallel Implementation of Expectation-Maximization for Fast Convergence

Latent variable models have become popular tools for modeling data that potentially arise from latent causes. This class of models includes the unstructured Gaussian mixture model (GMM), the temporally linked Hidden Markov Model (HMM), and the more complex latent Dirichlet allocation (LDA) [3]. Despite the varying latent variable structures, these models all pose challenging inference problems as t...


Learning Infinite Hidden Relational Models

Relational learning analyzes the probabilistic constraints between the attributes of entities and relationships. We extend the expressiveness of relational models by introducing for each entity (or object) an infinite-state latent variable as part of a Dirichlet process (DP) mixture model. It can be viewed as a relational generalization of a hidden Markov random field. The information propagates ...


A Theoretical and Practical Implementation Tutorial on Topic Modeling and Gibbs Sampling

This technical report provides a tutorial on the theoretical details of probabilistic topic modeling and gives practical steps on implementing topic models such as Latent Dirichlet Allocation (LDA) through the Markov Chain Monte Carlo approximate inference algorithm Gibbs Sampling.
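The LDA inference scheme such tutorials describe is usually the collapsed Gibbs sampler of Griffiths and Steyvers, in which the topic and word distributions are integrated out and only per-token topic assignments are sampled. A compact sketch with illustrative names:

```python
import numpy as np

rng = np.random.default_rng(3)

def lda_gibbs(docs, K, V, n_iter=50, alpha=0.1, beta=0.01):
    """Collapsed Gibbs sampler for LDA.

    docs: list of word-id lists over a vocabulary of size V.
    Theta and phi are integrated out analytically; only the per-token
    topic assignments z are sampled from their full conditional.
    """
    D = len(docs)
    ndk = np.zeros((D, K))   # doc-topic counts
    nkw = np.zeros((K, V))   # topic-word counts
    nk = np.zeros(K)         # tokens per topic
    z = [rng.integers(K, size=len(doc)) for doc in docs]
    for d, doc in enumerate(docs):
        for w, k in zip(doc, z[d]):
            ndk[d, k] += 1
            nkw[k, w] += 1
            nk[k] += 1
    for _ in range(n_iter):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]  # withdraw this token's current assignment
                ndk[d, k] -= 1; nkw[k, w] -= 1; nk[k] -= 1
                # p(z_i = k | z_-i, words): doc affinity x topic-word affinity
                p = (ndk[d] + alpha) * (nkw[:, w] + beta) / (nk + V * beta)
                k = rng.choice(K, p=p / p.sum())
                z[d][i] = k
                ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1
    return z, ndk, nkw
```

Topic-word and doc-topic distributions can then be recovered from the smoothed count tables, which is the practical recipe such tutorials walk through.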




Publication date: 2012